Minimum complexity density estimation
Authors: A. R. Barron and T. M. Cover
Abstract
The minimum complexity or minimum description-length criterion developed by Kolmogorov, Rissanen, Wallace, Sorkin, and others leads to consistent probability density estimators. These density estimators are defined to achieve the best compromise between likelihood and simplicity. A related issue is the compromise between accuracy of approximations and complexity relative to the sample size. An index of resolvability is studied which is shown to bound the statistical accuracy of the density estimators, as well as the information-theoretic redundancy.

Index Terms: Kolmogorov complexity, minimum description-length criterion, universal data compression, bounds on redundancy, resolvability of functions, model selection, density estimation, discovery of probability laws, consistency, statistical convergence rates.
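As an illustrative sketch of the quantities the abstract refers to (the notation below follows the usual presentation of two-stage MDL estimation and is not copied from the paper; logarithm bases and constants are suppressed): given a countable list Γ_n of candidate densities q with codelengths L_n(q) satisfying Kraft's inequality, the estimator minimizes a two-part code length, and the index of resolvability measures the best achievable trade-off between complexity per sample and approximation error:

\[
  \hat{p}_n \;=\; \arg\min_{q \in \Gamma_n} \Bigl\{ L_n(q) \;+\; \log \tfrac{1}{q(X_1,\dots,X_n)} \Bigr\},
  \qquad
  \sum_{q \in \Gamma_n} 2^{-L_n(q)} \le 1,
\]
\[
  R_n(p) \;=\; \min_{q \in \Gamma_n} \Bigl\{ \tfrac{L_n(q)}{n} \;+\; D(p \,\|\, q) \Bigr\}.
\]

Bounds of the kind described then control the (squared Hellinger) distance between the true density p and \hat{p}_n, as well as the redundancy of the associated two-stage code, by a constant multiple of R_n(p).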
Similar Articles
From ε-entropy to KL-entropy: Analysis of Minimum Information Complexity Density Estimation
We consider an extension of ε-entropy to a KL-divergence-based complexity measure for randomized density estimation methods. Based on this extension, we develop a general information-theoretic inequality that measures the statistical complexity of some deterministic and randomized density estimators. Consequences of the new inequality will be presented. In particular, we show that this techniq...
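The snippet is cut off before the inequality itself; purely as a hedged sketch of the shape such complexity bounds take (this is the generic complexity-regularization form, not necessarily the exact statement of the cited paper), for a prior \pi over candidate densities p_\theta and a randomized estimator drawn from a data-dependent distribution \hat{w} one expects a bound of the form

\[
  \mathbb{E}\, \mathbb{E}_{\theta \sim \hat{w}}\, D(p \,\|\, p_\theta)
  \;\lesssim\;
  \inf_{\rho} \Bigl\{ \mathbb{E}_{\theta \sim \rho}\, D(p \,\|\, p_\theta) \;+\; \tfrac{\mathrm{KL}(\rho \,\|\, \pi)}{n} \Bigr\},
\]

with the infimum taken over distributions \rho on the parameter space; the KL term plays the role that the codelength term L_n(q)/n plays in the index of resolvability above.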
Robust Estimation in Linear Regression Model: The Density Power Divergence Approach
The minimum density power divergence method provides robust estimates when the dataset contains a number of outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
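As an illustration of the estimator described here (a minimal sketch assuming the standard density power divergence objective of Basu et al. for a normal linear model; the function names, the tuning value alpha = 0.5, and the use of scipy are my own choices, not taken from the cited paper):

import numpy as np
from scipy.optimize import minimize

def dpd_objective(params, X, y, alpha=0.5):
    # Empirical density power divergence objective for y ~ N(X @ beta, sigma^2).
    # alpha -> 0 recovers maximum likelihood; larger alpha downweights outliers.
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                        # keep sigma positive
    r = y - X @ beta                                 # residuals
    c = (2 * np.pi * sigma**2) ** (-alpha / 2)
    term1 = c / np.sqrt(1 + alpha)                   # integral of f_theta^(1 + alpha)
    term2 = (1 + 1 / alpha) * c * np.mean(np.exp(-alpha * r**2 / (2 * sigma**2)))
    return term1 - term2                             # term not involving theta is dropped

def min_dpd_fit(X, y, alpha=0.5):
    # Minimum DPD estimate of (beta, sigma), started from ordinary least squares.
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
    x0 = np.append(beta0, np.log(np.std(y - X @ beta0) + 1e-8))
    res = minimize(dpd_objective, x0, args=(X, y, alpha), method="Nelder-Mead")
    return res.x[:-1], np.exp(res.x[-1])

# toy check: a few gross outliers barely move the DPD fit
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=200)
y[:10] += 20.0                                       # contaminate 5% of the responses
print(min_dpd_fit(X, y, alpha=0.5))

Comparing this fit with ordinary least squares on the same contaminated data illustrates the robustness claim made in the abstract.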
Correction to 'Minimum Complexity Density Estimation'
Manuscript submitted July 1991. A. R. Barron is with the Department of Statistics and the Department of Electrical and Computer Engineering, University of Illinois, 101 Illini Hall, 725 S. Wright Street, Champaign, IL 61820. T. M. Cover is with the Departments of Electrical Engineering and Statistics, Durand, Room 121, Stanford University, Stanford, CA 94305. IEEE Log Number 9103053. A. R. Ba...
MDL Histogram Density Estimation
We regard histogram density estimation as a model selection problem. Our approach is based on the information-theoretic minimum description length (MDL) principle, which can be applied for tasks such as data clustering, density estimation, image denoising and model selection in general. MDL-based model selection is formalized via the normalized maximum likelihood (NML) distribution, which has se...
Information-Theoretically Optimal Histogram Density Estimation
We regard histogram density estimation as a model selection problem. Our approach is based on the information-theoretic minimum description length (MDL) principle. MDL-based model selection is formalized via the normalized maximum likelihood (NML) distribution, which has several desirable optimality properties. We show how this approach can be applied for learning generic, irregular (variable-wi...
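For the two histogram entries above, a rough sketch of the selection idea (this uses a simple two-part code length for equal-width bins as a BIC-style stand-in for the exact NML criterion of the cited papers; all names below are illustrative):

import numpy as np

def two_part_code_length(x, k):
    # Approximate code length of x under a k-bin equal-width histogram:
    # data cost  = negative log-likelihood of the histogram density,
    # model cost = (k - 1)/2 * log(n) for the k - 1 free bin probabilities.
    n = len(x)
    lo, hi = x.min(), x.max()
    counts, _ = np.histogram(x, bins=k, range=(lo, hi))
    width = (hi - lo) / k
    dens = counts / (n * width)
    nll = -np.sum(counts[counts > 0] * np.log(dens[counts > 0]))
    return nll + 0.5 * (k - 1) * np.log(n)

def mdl_bin_count(x, max_bins=50):
    # Pick the bin count minimizing the approximate two-part code length.
    return min(range(1, max_bins + 1), key=lambda k: two_part_code_length(x, k))

# toy usage on a bimodal sample
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 1.0, 500)])
print("selected number of bins:", mdl_bin_count(x))

The cited papers replace the (k - 1)/2 * log(n) penalty with the exact normalized maximum likelihood (stochastic complexity) of the multinomial model and also allow variable-width bins.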
Journal: IEEE Trans. Information Theory
Volume: 37
Issue: -
Pages: -
Year of publication: 1991